Fullstack Job: Data Engineer

Company

INVENT.us
Canada

Location

Remote Position
(From Everywhere/No Office Location)

Job type

Full-Time

Fullstack Job Details

About INVENT.us

INVENT is an innovative software development consulting firm founded by industry veteran Oleg Tishkevich and an elite team of Cloud technologists. Purpose-built to help financial organizations modernize their advisor technology stack, INVENT is transforming how financial services operate. For years, financial organizations have been hard at work trying to upgrade their systems to stay current with today’s more complex and rapidly changing demands; however, the pace of change is now faster than their ability to keep up.

That is why we created INVENT: to leverage our unique combination of domain expertise, design, and technical implementation, along with broad connections throughout the wealth management industry, to simplify complex system architecture and get our clients fully Cloud-native.

Through our innovative software development model, organizations can quickly streamline their technology infrastructure and legacy systems, simplify APIs, and integrate siloed applications and systems into a unified Cloud ecosystem.

Full-time, remote, independent contractor agreement

Our team is growing, and we are now looking for a passionate Data Engineer to help design and implement cloud platform capabilities. You will work collaboratively within a distributed team to solve complex technical challenges.

Responsibilities

We are looking for a Senior Developer to join our Data Hub Platform core team. This specialist will contribute to the development of the Multi-Tenant Data Hub Integration Platform from various angles.

· Create a core integration service that works seamlessly with the Invent.us Identity Provider and Access Management services.

· Develop the integration adapter SDK.

· Develop administration adapter capabilities and integrate them with the front end of the Invent.us core application.

· Create pipelines within the integration adapter concept.

· Analyze external data sources and help with data mapping.

Job Requirements

· Cloud: Azure, AWS

· Programming Languages: Scala, Java, Python, SQL

· Integration: ELT, pipelines, streaming, batches, micro-batches, enterprise integration patterns

· Data Modeling: data vault, data warehouse, big data, data lake, dimensional modeling

· Services: Temporal, microservices, Spring Framework, Databricks, Kubernetes, Docker, Kafka, queues

Job Type: Full-time

Schedule:

  • Monday to Friday

Work Location: Remote